BlackRock — Data Engineer, Python and SQL, Associate

Posted: 30-06-2025

Description


About BlackRock:

BlackRock is the world’s largest asset manager, committed to helping people experience financial well-being. The firm serves clients worldwide, helping them save for retirement, pay for education, buy homes, and start businesses. BlackRock’s investments fuel economic growth, finance infrastructure, and foster innovation.

At BlackRock, people are at the core of the mission. The firm invests heavily in its employees, ensuring they feel welcomed, valued, and supported through professional development, strong networks, and robust benefits.

About This Role:

BlackRock’s Aladdin Engineering team is seeking a skilled Data Engineer to join its Regulatory Tech group. The Regulatory Tech team builds a comprehensive surveillance platform for compliance, helping protect BlackRock from risks such as market manipulation, fraud, and other financial misconduct. This platform is widely used within BlackRock and is undergoing significant feature enhancements to make it available to external clients.

As part of this team, you’ll contribute to a culture defined by curiosity, bravery, passion, openness, and innovation. You’ll work in a collaborative environment where solving complex problems and delivering high-impact solutions is the norm.

Key Responsibilities:

  • Collaborate with engineers, project managers, technical leads, business owners, and analysts throughout the software development lifecycle (SDLC).
  • Design and implement new features for the core product’s data platform and suspicious activity detection mechanisms.
  • Contribute ideas for improving platform resiliency, stability, and performance.
  • Help define and document coding standards and best practices.
  • Participate in code reviews and development discussions to uphold high-quality engineering practices.

Key Technical Skills:

Python, SQL, Snowflake, Airflow, Git, CI/CD, Unit Testing, End-to-End Testing, Data Engineering

Requirements:

  • 3+ years of hands-on experience with Python and SQL.
  • Experience working with Snowflake databases.
  • Experience using Airflow for orchestration and data pipeline management.
  • Familiarity with version control systems such as Git, and proficiency with CI/CD pipelines.
  • Interest in and foundational knowledge of data engineering.
  • Strong verbal and written communication skills for collaborating with technical and non-technical stakeholders.

Nice to have:

  • Experience with dbt or Great Expectations.
  • Knowledge of Big Data technologies such as Spark, Sqoop, HDFS, and YARN.
  • Experience working in Agile development environments.
